Improved Bounds on Lossless Source Coding and Guessing Moments via Rényi Measures
Authors
Abstract
This paper provides upper and lower bounds on the optimal guessing moments of a random variable taking values on a finite set when side information may be available. These moments quantify the number of guesses required for correctly identifying the unknown object and, similarly to Arikan’s bounds, they are expressed in terms of the Arimoto-Rényi conditional entropy. Although Arikan’s bounds are asymptotically tight, the improvement of the bounds in this paper is significant in the non-asymptotic regime. Relationships between moments of the optimal guessing function and the MAP error probability are also established, characterizing the exact locus of their attainable values. The bounds on optimal guessing moments serve to improve non-asymptotic bounds on the cumulant generating function of the codeword lengths for fixed-to-variable optimal lossless source coding without prefix constraints. Non-asymptotic bounds on the reliability function of discrete memoryless sources are derived as well. Relying on these techniques, lower bounds on the cumulant generating function of the codeword lengths are derived, by means of the smooth Rényi entropy, for source codes that allow decoding errors.
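As a concrete illustration of the quantities discussed above, the following Python sketch computes the optimal guessing moment E[G*(X|Y)^rho] for a small joint distribution (the optimal guesser asks about the candidate values of X in decreasing order of their posterior probability given Y) and checks it against Arikan's classical lower and upper bounds, both expressed through the Arimoto-Rényi conditional entropy of order 1/(1+rho). The joint pmf and the choice rho = 1 are illustrative assumptions rather than values from the paper, and only Arikan's original bounds are reproduced here, not the improved bounds derived in this work.

import numpy as np

def arimoto_renyi_cond_entropy(p_xy, alpha):
    # Arimoto-Renyi conditional entropy H_alpha(X|Y) in nats;
    # p_xy is a joint pmf with rows indexed by x and columns by y.
    inner = (p_xy ** alpha).sum(axis=0) ** (1.0 / alpha)   # one term per y
    return (alpha / (1.0 - alpha)) * np.log(inner.sum())

def optimal_guessing_moment(p_xy, rho):
    # E[G*(X|Y)^rho] for the optimal guesser, which, for each observed y,
    # asks "is X = x?" in decreasing order of P(x, y) (equivalently P(x|y)).
    moment = 0.0
    for y in range(p_xy.shape[1]):
        col = np.sort(p_xy[:, y])[::-1]      # P(x, y) sorted in decreasing order
        ranks = np.arange(1, col.size + 1)   # guess number assigned to each x
        moment += (col * ranks ** rho).sum()
    return moment

# toy joint pmf (rows: x, columns: y); the numbers are purely illustrative
p_xy = np.array([[0.30, 0.05],
                 [0.10, 0.20],
                 [0.05, 0.10],
                 [0.15, 0.05]])
rho = 1.0
M = p_xy.shape[0]                    # alphabet size of X
alpha = 1.0 / (1.0 + rho)

g_moment = optimal_guessing_moment(p_xy, rho)
h = arimoto_renyi_cond_entropy(p_xy, alpha)
lower = (1.0 + np.log(M)) ** (-rho) * np.exp(rho * h)   # Arikan's lower bound
upper = np.exp(rho * h)                                 # Arikan's upper bound
print(f"{lower:.3f} <= E[G*(X|Y)^rho] = {g_moment:.3f} <= {upper:.3f}")

The ratio between the printed upper and lower bounds is the (1 + ln M)^rho factor in Arikan's lower bound; it is this non-asymptotic gap that the improved bounds of the paper address.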
Related papers
Joint Source-Channel Coding and Guessing with Application to Sequential Decoding
We extend our earlier work on guessing subject to distortion to the joint source-channel coding context. We consider a system in which there is a source connected to a destination via a channel and the goal is to reconstruct the source output at the destination within a prescribed distortion level with respect to (w.r.t.) some distortion measure. The decoder is a guessing decoder in the sense t...
Equivocations, Exponents and Second-Order Coding Rates under Various Rényi Information Measures
In this paper, we evaluate the asymptotics of equivocations, their exponents as well as their second-order coding rates under various Rényi information measures. Specifically, we consider the effect of applying a hash function on a source and we quantify the level of non-uniformity and dependence of the compressed source from another correlated source when the number of copies of the sources is...
Analysis of Remaining Uncertainties and Exponents under Various Conditional Rényi Entropies
In this paper, we analyze the asymptotics of the normalized remaining uncertainty of a source when a compressed or hashed version of it and correlated side-information is observed. For this system, commonly known as Slepian-Wolf source coding, we establish the optimal (minimum) rate of compression of the source to ensure that the remaining uncertainties vanish. We also study the exponential rat...
f-Divergences and Related Distances
Derivation of tight bounds on f-divergences and related distances is of interest in information theory and statistics. This paper improves some existing bounds on f-divergences. In some cases, an alternative approach leads to a simplified proof of an existing bound. Following bounds on the chi-squared divergence, an improved version of a reversed Pinsker’s inequality is derived for an arbitra...
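As a small numerical companion to this abstract, the Python sketch below checks two standard relations among the distances it mentions: Pinsker's inequality D(P||Q) >= 2*delta(P,Q)^2, and the bound D(P||Q) <= ln(1 + chi^2(P||Q)) relating the relative entropy to the chi-squared divergence, both in nats. The two pmfs are assumptions made for this example; the improved and reversed Pinsker-type inequalities derived in the cited paper are not reproduced here.

import numpy as np

def kl_divergence(p, q):
    # Relative entropy D(P||Q) in nats; assumes supp(P) is contained in supp(Q).
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def total_variation(p, q):
    # Total variation distance delta(P, Q) = (1/2) * sum_x |P(x) - Q(x)|.
    return 0.5 * float(np.abs(p - q).sum())

def chi_squared(p, q):
    # Chi-squared divergence chi^2(P||Q) = sum_x (P(x) - Q(x))^2 / Q(x).
    return float(np.sum((p - q) ** 2 / q))

# two illustrative pmfs on a four-letter alphabet (assumed for this example)
p = np.array([0.50, 0.25, 0.15, 0.10])
q = np.array([0.25, 0.25, 0.25, 0.25])

d, tv, chi2 = kl_divergence(p, q), total_variation(p, q), chi_squared(p, q)
print(f"Pinsker    : D = {d:.4f} >= 2*delta^2     = {2 * tv ** 2:.4f}")
print(f"Chi-squared: D = {d:.4f} <= ln(1 + chi^2) = {np.log1p(chi2):.4f}")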
Universal source coding for complementary delivery
This paper deals with a universal coding problem for a certain kind of multiterminal source coding system that we call the complementary delivery coding system. Both fixed-to-fixed length and fixed-to-variable length lossless coding schemes are considered. Explicit constructions of universal codes and bounds of the error probabilities are clarified via type-theoretical and graph-theoretical ana...
Journal: CoRR
Volume: abs/1801.01265
Year of publication: 2018